A Fast Descent Method for the Hydro Storage
Abstract
For many years, energy optimization has dealt with large-scale mixed-integer linear programs. This paper concentrates on programs used for controlling an existing generation system consisting of thermal power units and pumped hydro storage plants; such programs must therefore be solved in real time. The problem can be decomposed into smaller problems using Lagrangian relaxation. One of these subproblems is still a large-scale multistage problem and deals with the pumped hydro storage plants only. In this paper, this subproblem is investigated in detail. Its objective function is linear but stochastic. Exploiting the special structure of the constraints, a solution method based on a subset of descent directions was developed. This method was compared with available standard software for multistage linear programs.
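The abstract does not spell out the descent scheme itself; as a minimal sketch of the setting, the toy Python below runs projected descent steps for a linear objective with sampled prices over simple capacity bounds. All names (descent_step, the 24-hour horizon, the box constraints) are illustrative assumptions; the paper's reservoir-balance coupling and its restricted set of descent directions are omitted for brevity.

```python
import numpy as np

def descent_step(x, c, lo, hi, step=1.0):
    """One projected descent step for min c @ x s.t. lo <= x <= hi.

    A toy stand-in for the hydro subproblem: the objective is linear
    (c plays the role of the stochastic prices, here a fixed sample)
    and the feasible set is box-shaped.  The direction -c is a descent
    direction; clipping keeps the iterate feasible.
    """
    return np.clip(x - step * c, lo, hi)

# Toy 24-stage instance: x[t] is the net generation of the storage
# plant in hour t, bounded by pumping/turbine capacity.
rng = np.random.default_rng(0)
c = rng.normal(size=24)            # sampled hourly prices (negated revenue)
x = np.zeros(24)
lo, hi = -np.ones(24), np.ones(24)
for _ in range(50):
    x = descent_step(x, c, lo, hi, step=0.1)
print(float(c @ x))                # objective value after 50 steps
```

For a linear objective over a box the iterates drift to a corner (generate at full capacity when the sampled price rewards it, pump otherwise), which is the behavior the coupling constraints in the real subproblem then restrict.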
Similar Papers
Accelerating Stochastic Gradient Descent using Predictive Variance Reduction
Stochastic gradient descent is popular for large scale optimization but has slow convergence asymptotically due to the inherent variance. To remedy this problem, we introduce an explicit variance reduction method for stochastic gradient descent which we call stochastic variance reduced gradient (SVRG). For smooth and strongly convex functions, we prove that this method enjoys the same fast conv...
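Since the abstract is cut off, a minimal sketch of the SVRG update it describes may help: a full gradient is computed at a periodic snapshot and used as a control variate for the per-sample stochastic gradients. The function names and defaults below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def svrg(grad_i, w0, n, lr=0.01, epochs=10, m=None, seed=0):
    """Minimal SVRG sketch.  grad_i(w, i) returns the gradient of the
    i-th summand of the objective; n is the number of summands."""
    rng = np.random.default_rng(seed)
    m = m or 2 * n                        # inner-loop length
    w_snap = w0.copy()
    for _ in range(epochs):
        # full gradient at the snapshot -- the control variate
        mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        w = w_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient: unbiased, and its
            # variance shrinks as w approaches the snapshot
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= lr * g
        w_snap = w
    return w_snap

# Example: least squares on random data
A = np.random.default_rng(1).normal(size=(100, 5))
b = A @ np.ones(5)
grad = lambda w, i: 2 * A[i] * (A[i] @ w - b[i])
w = svrg(grad, np.zeros(5), n=100)
```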
A Fast Distributed Stochastic Gradient Descent Algorithm for Matrix Factorization
The accuracy and effectiveness of the matrix factorization technique were well demonstrated in the Netflix movie recommendation contest. Among the numerous solutions for matrix factorization, Stochastic Gradient Descent (SGD) is one of the most widely used algorithms. However, as a sequential approach, the SGD algorithm cannot be used directly in the Distributed Cluster Environment (DCE). In this paper...
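For context on why sequential SGD is the bottleneck, here is a minimal sketch of plain SGD matrix factorization; each update touches one row of P and one row of Q, which is the shared state a distributed scheme must coordinate. The block-partitioning remark in the docstring describes the general family of distributed approaches, not necessarily this paper's algorithm.

```python
import numpy as np

def sgd_mf(ratings, n_users, n_items, k=8, lr=0.01, reg=0.05,
           epochs=20, seed=0):
    """Plain sequential SGD for matrix factorization R ~ P @ Q.T.

    ratings: list of (user, item, value) triples.  Distributed
    variants typically partition the triples into blocks with
    disjoint user and item ranges, so blocks can be processed in
    parallel without conflicting updates to rows of P and Q
    (an assumption about the general scheme).
    """
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.normal(size=(n_users, k))
    Q = 0.1 * rng.normal(size=(n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            # simultaneous update: right-hand sides use the old rows
            P[u], Q[i] = (P[u] + lr * (err * Q[i] - reg * P[u]),
                          Q[i] + lr * (err * P[u] - reg * Q[i]))
    return P, Q

# Toy usage: 3 users, 4 items, a few observed ratings
data = [(0, 1, 5.0), (1, 2, 3.0), (2, 0, 4.0), (0, 3, 1.0)]
P, Q = sgd_mf(data, n_users=3, n_items=4, k=2)
print(P[0] @ Q[1])   # reconstructed rating for (user 0, item 1)
```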
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
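The double-parameter formula itself is not reproduced in this summary; for reference, the standard secant relation that such a Hessian approximation must satisfy is the following (standard quasi-Newton notation, not quoted from the paper):

```latex
% Secant relation for a quasi-Newton Hessian approximation B_{k+1},
% where s_k is the step and y_k the change in the gradient:
B_{k+1} s_k = y_k, \qquad
s_k = x_{k+1} - x_k, \quad
y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```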
متن کاملFast Supervised Discrete Hashing and its Analysis
In this paper, we propose a learning-based supervised discrete hashing method. Binary hashing is widely used for large-scale image retrieval as well as video and document searches, because the compact representation of binary codes is economical for data storage and fast for query searches using bit operations. The recently proposed Supervised Discrete Hashing (SDH) efficiently solves mixed-...
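As a small illustration of the bit-operation search that binary codes enable (generic Hamming ranking, not the SDH method itself; all names below are illustrative):

```python
import numpy as np

def hamming_rank(query, codes):
    """Rank stored items by Hamming distance to a query hash code.

    codes: np.uint64 array of packed 64-bit binary codes.  XOR marks
    the differing bits; the per-item popcount is the Hamming distance.
    """
    xor = np.bitwise_xor(codes, np.uint64(query))
    dists = np.array([bin(int(v)).count("1") for v in xor])
    return np.argsort(dists)

codes = np.array([0b1010, 0b1111, 0b0000], dtype=np.uint64)
print(hamming_rank(0b1011, codes))   # closest codes listed first
```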
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
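For reference, the sufficient descent property in question is the standard condition (standard notation, not quoted from the paper):

```latex
% Sufficient descent condition: for all k and some constant c > 0,
% with g_k = \nabla f(x_k) and search direction d_k,
d_k^{\top} g_k \le -c \, \lVert g_k \rVert^{2}.
```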